Patent abstract:
Method (400) and control unit (310) in a vehicle (100) comprising a digital mirror (300), for detecting an object (200) situated outside a default view (a) of a display (320-1, 320-2) of the digital mirror (300) and visualising the detected object (200) for a driver of the vehicle (100). The method (400) comprises detecting (401) the object (200), situated outside the default view (a) of the display (320-1, 320-2), by a sensor (120-1, 120-2); and adjusting (403) the field of view of a subset (340) of the display (320-1, 320-2) by widening the viewing angle so that the object (200) becomes visible for the driver in the subset (340) of the display (320-1, 320-2), while the main portion (330) of the display (320-1, 320-2) outputs the default view (a) at a default viewing angle.
Publication number: SE1650924A1
Application number: SE1650924
Filing date: 2016-06-28
Publication date: 2017-12-29
Inventors: Claezon Fredrich; Lindberg Mikael
Applicant: Scania Cv Ab
IPC main class:
Patent description:

METHOD AND CONTROL UNIT FOR A DIGITAL MIRROR

TECHNICAL FIELD

This document relates to a method and a control unit in a vehicle. More particularly, a method and a control unit are described, for detecting and tracking an object situated in a blind spot of a rear view mirror of the vehicle.
BACKGROUND

There are areas around a vehicle that are not visible to the driver, neither by direct vision nor indirectly with the help of a mirror. These areas are sometimes referred to as "blind spots" of the driver. The visibility problems are particularly substantial in heavy vehicles such as trucks, and in particular for trucks with trailers.
The problem is exacerbated when the vehicle is turning. There is then a risk of an accident if the driver of the vehicle changes direction while an object is situated in such a blind spot.
Various attempts have been made to solve this problem, e.g. by enlarging mirrors, by adding extra mirrors or by adding sensors, such as cameras, covering the blind spots and generating a warning signal on a display or similar. Adding additional mirrors and/or sensors is however expensive. It is also difficult to install them in a convenient way and to present the information concerning coverage of the blind spot to the driver in a convenient manner. Repeatedly triggering alert signals just because another vehicle is momentarily not visible in the mirror may be perceived as disturbing and even distracting by the driver. Further, additional and/or larger vehicle-external mirrors add air resistance and thus also increase fuel consumption. Also, the driver's direct visibility in the direction of the rear view mirrors is further limited by the mirrors themselves.
Another known solution is to use wide-angle mirrors in addition to the standard rear view mirrors. Thereby blind spots around the vehicle may be at least partly covered, but unfortunately wide-angle mirrors affect the perspective as perceived by the driver. Objects situated close to the wide-angle mirror seem bigger/closer than they are in reality, while objects situated further away seem smaller/more distant than they are in reality. Such a distorted view may confuse or disorientate the driver, which may lead to accidents due to inappropriate driver reactions to observed objects. Also, the driver has to check both mirrors, which takes time and may even present an ergonomic problem for the driver.

It is also known to replace a conventional rear view mirror of a vehicle with a pair of cameras (situated outside the vehicle) and a corresponding display (situated in the cabin). This arrangement may sometimes be referred to as a digital rear view mirror. An advantage is that air resistance may be reduced, as the camera is considerably smaller than a rear view mirror. However, the above discussed problems with blind spots around the vehicle are not solved merely by making such a replacement.
A problem that appears when exchanging mirrors for cameras, however, is that a digital rear view mirror has a fixed field of view which does not change when the driver changes eye position. This leads to a situation where the driver is not able to keep track of another object passing the own vehicle by leaning forward/backwards, as is the case when using conventional mirrors. The camera will capture and display the same field of view independently of any driver movements and eye positions. Thereby a possibly dangerous situation is created when exchanging rear view mirrors for digital rear view mirrors.
The document US 20050111117 relates to automatic viewing of a vehicle blind spot. A vehicle has rear view mirrors which may be redirected by a motor. A sensor may detect an object in a blind spot of the vehicle. The motor may then redirect the mirror to provide a view of the blind spot (with the detected object) to the driver.
A disadvantage with the disclosed solution is that it is based on conventional rear view mirrors, having the above mentioned disadvantage of high air resistance in comparison with a digital mirror solution. Further, the direct view of the driver in the direction of each rear view mirror is affected. Another problem with changing the direction of the rear view mirror is that the driver may become uncertain of what is actually displayed in the mirror and where the object is positioned in relation to the own vehicle.
Yet another disadvantage is that when driving in dense traffic, the vehicle may be surrounded by several other vehicles, some within the field of view, some outside. When the mirror is redirected to capture the object in the blind spot, a new blind spot will emerge that previously was covered by the mirror. In case the driver is not aware of a vehicle there, the situation after the mirror has been redirected may be as dangerous as before, but with another object in another blind spot.
Document US 6193380 presents a solution similar to US 20050111117, but with an automatic return of the mirror to a default position when the object is no longer present in the vehicle blind spot.
This solution shares the same disadvantages as the previously discussed solution in document US 20050111117.
Document US 20140071278 describes a solution for blind-spot detection and collision avoidance where a blind-spot camera is located between a side-view mirror and the vehicle body. The rear view mirror has an integrated display that displays an indication of objects detected in blind spots. The camera may then follow the object as it moves about the perimeter of the driver's vehicle.
Also this solution shares the above mentioned disadvantages of conventional vehicle-external mirrors, but makes it easier for the driver to distinguish between the normal view of the rear view mirror (which is reflected in the mirror) and the blind spot camera view (which is presented on the display).

It would thus be desirable to improve digital rear view mirrors of vehicles in order to reduce problems associated with conventional rear view mirrors and blind spots, and thereby enhance traffic safety by improving the visibility of the driver.
SUMMARY

It is therefore an object of this invention to solve at least some of the above problems and improve traffic safety.
According to a first aspect of the invention, this objective is achieved by a method in a vehicle comprising a digital mirror. The method aims at detecting an object situated outside a default view of a display of the digital mirror and visualising the detected object for a driver of the vehicle. The method comprises detecting the object, situated outside the default view of the display, by a sensor. Further, the method also comprises adjusting the field of view of a subset of the display by widening the viewing angle so that the object becomes visible for the driver in the subset of the display, while the main portion of the display outputs the default view at a default viewing angle.
According to a second aspect of the invention, this objective is achieved by a control unit in a vehicle comprising a digital mirror. The control unit aims at detecting an object situated outside a default view of a display of the digital mirror and visualising the detected object for a driver of the vehicle. The control unit is configured to detect the object, situated outside a default view of the display, based on signals received from a sensor. Further, the control unit is configured to generate control signals to adjust the field of view of a subset of the display by widening the viewing angle so that the object becomes visible for the driver in the subset of the display, while the main portion of the display outputs the default view at a default viewing angle.
Thanks to the described aspects, by adjusting the field of view of a subset of the display by widening the viewing angle so that the object becomes visible for the driver in the subset of the display, while the main portion of the display outputs the default view at a default viewing angle, it becomes possible for the driver to detect an object in a blind spot of the driver, while still being able to see the standard view along the side of the vehicle. Thereby the risk of drivers being disoriented is eliminated or at least reduced, while the driver's attention is directed to the object in the blind spot. It is thereby also easier for the driver to correctly estimate the distance between the own vehicle and the detected object. Also, a broader area may be covered and a situation where a plurality of objects is situated behind and/or at the side of the vehicle may be presented for the driver. Further, the presented solution may be realised without additional sensors, besides the sensors already provided on the vehicle for other purposes. Thereby the problems of blind spots around the vehicle may be eliminated or at least reduced without increased sensor costs. Thus increased traffic safety is achieved.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES

Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1 illustrates a vehicle according to an embodiment of the invention;
Figure 2A illustrates a vehicle according to an embodiment, as regarded from above;
Figure 2B illustrates a vehicle according to an embodiment, as regarded from above;
Figure 2C illustrates a side view of a vehicle according to an embodiment;
Figure 2D illustrates a side view of a vehicle according to an embodiment;
Figure 3 illustrates an example of a vehicle interior according to an embodiment;
Figure 4 is a flow chart illustrating an embodiment of the method;
Figure 5 is an illustration depicting a system according to an embodiment.
DETAILED DESCRIPTION

Embodiments of the invention described herein are defined as a method and a control unit, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.

Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1 illustrates a scenario with a vehicle 100. The vehicle 100 is driving on a road in a driving direction 105.
The vehicle 100 may comprise e.g. a truck, a bus, a car or any similar vehicle or other means of conveyance.
The vehicle 100 may comprise a camera 110, directed to capture images at the back and the side of the vehicle 100. The camera 110 may comprise a video camera, or a camera configured for streaming images. The camera 110 is part of a digital mirror, replacing the rear view mirror on each of the two sides of the vehicle 100, together with a connected control unit and a display outputting images captured by the camera 110, possibly image processed by the control unit.
The vehicle 100 furthermore comprises a sensor 120-1, 120-2. The sensors 120-1, 120-2 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or a similar device, in different embodiments. The sensors 120-1, 120-2 may be directed to a left and a right side of the vehicle 100, respectively, as regarded in the driving direction 105 of the vehicle 100. Thereby, the sensors 120-1, 120-2 may detect any objects appearing at the respective side of the vehicle 100. The sensor 120-1, 120-2 may also determine a distance to the object, a direction to the object, a movement direction of the object and/or a velocity of the object, in different embodiments.

In some embodiments, the sensors 120-1, 120-2 may be mounted in the cabin behind the windshield, which has some advantages compared to externally mounted sensor systems. These advantages include protection from dirt, snow and rain, and to some extent also from damage, vandalism and/or theft. Such sensors 120-1, 120-2 may also be used for a variety of other tasks.
Thereby, the side view sensor 120-1, 120-2 may identify e.g. whether another vehicle is about to disappear into a blind spot of the driver, around the vehicle 100.

Instead of using traditional rear view mirrors on the vehicle 100, the camera 110, upon capturing an image or a stream of images, may output it/them on a display intended to display objects outside a driver's direct field of vision. Such a display may comprise e.g. a display in the cabin of the vehicle 100, a projector, a Head-Up Display, a transparent display being part of the windshield, intelligent glasses of the driver, etc., which output an image, or stream of images, captured by a corresponding camera 110. Typically, the camera on the left side of the vehicle 100 may be associated with a presentational device on the left side of the cabin, while the camera 110 on the right side of the vehicle 100 may be associated with a presentational device on the right side of the cabin, even if other combinations are possible.
By using the environmental sensors 120-1, 120-2 such as cameras, radars, lidars, ultrasonics etc., with object detection and tracking capabilities, it is possible to automatically adjust the field of view of the displays so that the driver is able to keep track of nearby objects in the digital rear view mirrors.
The field of view of a subset of the display is adjusted by widening the viewing angle so that the object becomes visible for the driver in the subset of the display, while the main portion of the display outputs the default view at a default viewing angle.
The driver is thereby able to see relevant objects in the digital rear view mirrors which would otherwise be outside of the field of view of the driver. The field of view of the driver is optimised to the driving situation, creating a relaxed and safe driving environment.
An advantage of the method is that the camera 110 (and thereby the view in the digital mirror) is not redirected to follow the detected object. It is thereby avoided that the driver is disoriented and becomes uncertain of what is displayed in the digital mirror. The driver is able to watch the rear part of the vehicle 100. By not widening the field of view of the camera 110/display over the whole display, but instead keeping a normal viewing angle in the main portion and following the detected and identified object in a subset, the main part of the outputted image is not distorted. It is thereby easy for the driver to estimate the distance to the object, unlike the case when the field of view is widened over the whole display.
Further, in some embodiments, a detected object around the vehicle 100 may be indicated on an overview presentation, e.g. on a display in the cabin, or on any alternative presentational device.
Traffic safety is thereby enhanced.
Figure 2A schematically illustrates a scenario, similar to the previously discussed scenario illustrated in Figure 1, but with the vehicle 100 seen from above and wherein an overtaking object 200 is depicted.
The object 200 may be a vehicle, a human, an animal, a lamp post or e.g. any imaginable static or dynamic item.
When the vehicle 100 is driving in the driving direction 105, the side-directed sensor 120-2 may detect the object 200 in this illustrated scenario, even when it can be seen neither by direct sight of the driver, nor within the field of view α captured by the camera 110-2. The object 200 is thus situated in the previously discussed blind spot of the driver, and a dangerous traffic situation may emerge in case the driver is not aware of the object 200 and decides to change driving lanes.
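The blind-spot condition described above, i.e. an object detectable by the side sensor 120-2 but outside the camera's field of view α, may be sketched as a simple planar geometry check. The following Python sketch is illustrative only; the coordinate frame, the function name and all numeric values are assumptions and not part of the disclosure.

```python
import math

def outside_default_view(obj_x, obj_y, cam_x, cam_y,
                         cam_heading_deg, half_angle_deg):
    """Return True if the object lies outside the camera's default view.

    Coordinates are in a vehicle-fixed frame (metres); cam_heading_deg is
    the optical-axis direction of the rearward-facing camera. Planar
    (top-view) geometry only, as in Figure 2A.
    """
    bearing = math.degrees(math.atan2(obj_y - cam_y, obj_x - cam_x))
    # Smallest signed angle between the bearing and the optical axis.
    off_axis = (bearing - cam_heading_deg + 180.0) % 360.0 - 180.0
    return abs(off_axis) > half_angle_deg

# Rear-facing left camera looking straight back (heading 180 deg) with an
# assumed 25 deg half-angle: an object almost alongside the cab is outside
# the default view, while one far behind is inside it.
print(outside_default_view(-1.0, 2.5, 0.0, 1.2, 180.0, 25.0))   # -> True
print(outside_default_view(-20.0, 1.5, 0.0, 1.2, 180.0, 25.0))  # -> False
```

In this sketch, the sensor 120-2 would supply the object coordinates, and a True result would trigger the widening of the subset view described below.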
Figure 2B schematically illustrates a scenario following the scenario illustrated in Figure 2A, in some embodiments.
The object 200 is captured within an extended field of view β of the camera 110-2 and outputted at a subset of the left side display.
The driver is thereby enabled to detect the object 200 without redirection of the camera 110-2, as the normal field of view α of the camera 110-2 is outputted on a main section of the display while a subset of the display outputs the extended field of view β of the camera 110-2.
Figure 2C schematically illustrates a scenario, similar to the previously discussed scenario illustrated in Figure 1, but with the vehicle 100 seen from the side and wherein an overtaking (rather low) object 200 is depicted.
The camera 110-2 is not able to capture the object 200 within the normal field of view α. However, the object 200 is detected by the sensor 120-2.
The object 200 is at this moment thus situated in a blind spot of the driver, which per se is not dangerous, but a dangerous situation may emerge in case the driver of the vehicle 100 decides to change lanes.
Figure 2D schematically illustrates a scenario following the scenario illustrated in Figure 2C, in some embodiments.
The object 200 is captured within an extended field of view β of the camera 110-2 and outputted at a subset of the left side display.
The driver is thereby enabled to detect the object 200 without redirection of the camera 110-2, as the normal field of view α of the camera 110-2 is outputted on a main section of the display while a subset of the display outputs the extended field of view β of the camera 110-2.
Thereby, the driver becomes aware of the object 200 at the left side of the own vehicle 100 and is enabled to continue driving with that vehicle in mind. By maintaining the same viewing angle of the camera 110-2 and the presentational device, distance estimation of the object 200 in relation to the own vehicle 100 is facilitated for the driver.

It may be noted that the vehicle 100 may have additional sensors in some embodiments, which may be of the same or different types.
Further, the field of view α, β of the camera 110-2/display may return to the normal field of view α in some embodiments, e.g. when the detected object 200 is no longer situated at the side of the vehicle 100 and/or when another object enters the zone at the left side of the vehicle 100.
Figure 3 illustrates an example of a vehicle interior of the vehicle 100 and depicts how the previously described scenario in Figure 1 and/or Figures 2A-2D may be perceived by the driver of the vehicle 100.
The vehicle 100 comprises a control unit 310, a right side display 320-1 intended to display objects 200 outside a driver's direct field of vision, situated on the right side of the vehicle 100, and a left side display 320-2 intended to display objects 200 outside a driver's direct field of vision, situated on the left side of the vehicle 100. Each such display 320-1, 320-2 is associated with a respective camera 110-1, 110-2, situated on the corresponding side of the vehicle 100. The cameras 110-1, 110-2 may typically comprise a respective video camera.
Thus the vehicle comprises digital rear view mirrors 300, each comprising one display 320-1, 320-2, one camera 110-1, 110-2 and the control unit 310.
The display 320-1, 320-2 may in some alternative embodiments be complemented or replaced by another presentational device such as e.g. a display, a loudspeaker, a projector, a head-up display, a display integrated in the windshield of the vehicle 100, a display integrated in the dashboard of the vehicle 100, a tactile device, a portable device of the vehicle driver/owner, intelligent glasses of the vehicle driver/owner, etc., or a combination thereof.
However, in some embodiments, the vehicle 100 may comprise a plurality of sensors 120-1, 120-2 on each side of the vehicle 100 for detecting objects 200. The sensors 120-1, 120-2 may be of the same or different types, such as e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or a similar device in different embodiments.
The sensors 120-1, 120-2 and the cameras 110-1, 110-2 may be the same devices in some alternative embodiments.
The displays 320-1, 320-2 each have a respective main portion 330, wherein a normal field of view α of the respective camera 110-1, 110-2 may be outputted, and a subset 340, where the extended field of view β of the respective camera 110-1, 110-2 may be displayed in case an object 200 is detected. The extended field of view β thus captures a wider view of the environmental reality than the normal field of view α. However, when no object 200 is detected, the normal field of view α of the respective camera 110-1, 110-2 may be outputted at both the main portion 330 and the subset 340 of the displays 320-1, 320-2.
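The split-display behaviour described above may be summarised in a small sketch: the main portion 330 always carries the default view α, while the subset 340 switches to the widened view β only while an object 200 is detected. The function below is a hypothetical illustration; the string labels merely stand in for the actual rendered camera views.

```python
def compose_mirror_frame(object_detected: bool) -> list[str]:
    """Compose the display as [main portion 330, subset 340].

    The main portion always carries the default view (viewing angle α);
    the subset carries the widened view (viewing angle β) only while an
    object is detected, and falls back to the default view otherwise.
    """
    main_view = "default(α)"
    subset_view = "widened(β)" if object_detected else "default(α)"
    return [main_view, subset_view]

print(compose_mirror_frame(False))  # no object: α over the whole display
print(compose_mirror_frame(True))   # object detected: subset switches to β
```

Because the main portion never changes, the undistorted reference view along the vehicle side is preserved regardless of the subset's state.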
The control unit 310 is able to detect the object 200, situated out of the normal field of view α, e.g. based on signals received from the sensor 120-1, 120-2. Upon such detection, the extended field of view β of the respective camera 110-1, 110-2 may be displayed in the subset 340 of the display 320-1, 320-2.
As the object 200 moves in relation to the vehicle 100, the extended field of view β may be adjusted, in some embodiments.

In some embodiments, when the object 200 is detected in the blind spot of the driver, the driver's attention may be caught, besides tracking the object 200 with said displays 320-1, 320-2/cameras 110-1, 110-2, by an audio signal, light signal, haptic signal etc.
Thereby the risk of an accident due to an object 200 appearing in a blind spot of the driver is reduced, as the driver is made aware of the object 200 and its position in relation to the own vehicle 100.
The control unit 310 may communicate with the sensors 120-1, 120-2, cameras 110-1, 110-2 and displays 320-1, 320-2 e.g. via a wired or wireless communication bus of the vehicle 100, or via a wired or wireless connection. The communication bus may comprise e.g. a Controller Area Network (CAN) bus, a Media Oriented Systems Transport (MOST) bus, or similar. However, the communication may alternatively be made over a wireless connection comprising, or at least inspired by, any of the previously discussed wireless communication technologies.
Figure 4 illustrates an example of a method 400 according to an embodiment. The flow chart in Figure 4 shows the method 400 for use in a vehicle 100 comprising a digital mirror 300. The method 400 aims at detecting an object 200 situated outside a default view α of a display 320-1, 320-2 of the digital mirror 300 and visualising the detected object 200 for a driver of the vehicle 100.
The vehicle 100 may be e.g. a truck, a bus, a car, or similar means of conveyance.
The vehicle 100 may comprise a plurality of sensors 120-1, 120-2 pointable towards the object 200, in some embodiments, simultaneously, shifted or sequentially in time.

In order to be able to correctly detect and visualise the object 200, the method 400 may comprise a number of steps 401-405. However, some of these steps 401-405 may be performed in various alternative manners. Some method steps may only be performed in some optional embodiments, such as e.g. step 402. Further, the described steps 401-405 may be performed in a somewhat different chronological order than the numbering suggests. The method 400 may comprise the subsequent steps:

Step 401 comprises detecting the object 200, situated outside the default view α of the display 320-1, 320-2, intended to display objects 200 outside a driver's direct field of vision, by a sensor 120-1, 120-2.

In some embodiments, the location of the detected object 200 in relation to the vehicle 100 may be determined.
Further, in some particular embodiments, the detection of the object 200 may also comprise identifying the object 200 by image recognition.
Image recognition/computer vision is a technical field comprising methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.
The image data of the cameras 110-1, 110-2 and/or sensors 120-1, 120-2 may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner.
Computer vision may comprise e.g. scene reconstruction, event detection, video tracking, object recognition, object pose estimation, learning, indexing, motion estimation, and image restoration, just to mention some examples.
Step 402, which may only be comprised in some embodiments, comprises determining the relative speed of the object 200 in relation to the vehicle 100, based on signals received from the sensor 120-1, 120-2.
Step 403 comprises adjusting the field of view of a subset 340 of the display 320-1, 320-2 by widening the viewing angle so that the object 200 becomes visible for the driver in the subset 340 of the display 320-1, 320-2, while the main portion 330 of the display 320-1, 320-2 outputs the default view α at a default viewing angle.
The adjustment of the field of view of the subset 340 of the display 320-1, 320-2 may in some embodiments be made at a pace corresponding to the determined relative speed between the vehicle 100 and the object 200. An advantage therewith is that the driver gets an intuitive feeling of the relative speed of the object 200.
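A possible interpretation of this pace-matched adjustment is a per-frame interpolation of the subset's viewing angle, where the step size scales with the determined relative speed. The sketch below is hypothetical; the rate constant `deg_per_metre` and all angle values are illustrative assumptions rather than disclosed parameters.

```python
def step_viewing_angle(current_deg, target_deg, relative_speed_mps, dt_s,
                       deg_per_metre=2.0):
    """Move the subset's viewing angle one frame towards the target angle.

    The widening (or narrowing) pace is proportional to the relative speed
    between the vehicle and the detected object, so a fast overtaker makes
    the view open up quickly while a slow one widens it gently.
    """
    max_step = deg_per_metre * abs(relative_speed_mps) * dt_s
    delta = target_deg - current_deg
    step = max(-max_step, min(max_step, delta))  # clamp to the allowed pace
    return current_deg + step

angle = 50.0                        # assumed default viewing angle
for _ in range(3):                  # object closing at 5 m/s, 50 ms frames
    angle = step_viewing_angle(angle, target_deg=80.0,
                               relative_speed_mps=5.0, dt_s=0.05)
print(round(angle, 2))              # -> 51.5
```

The same routine could serve step 405, with the default angle as target, so that the return to the default view is also paced by the relative speed.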
Step 404, which may only be performed in some particular embodiments, comprises estimating that the object 200 is visible for the driver, either in the main portion 330 of the display 320-1, 320-2, or by direct vision of the driver.
Step 405, which may only be performed in some particular embodiments, comprises adjusting the field of view of the subset 340 of the display 320-1, 320-2 to the default view α at the default viewing angle.
The adjustment of the field of view of the subset 340 of the display 320-1, 320-2 may in some embodiments be made at a pace corresponding to the determined relative speed. Thereby, the default field of view α is again outputted on the whole area of the display 320-1, 320-2.
Figure 5 illustrates an embodiment of a system 500 in a vehicle 100 comprising a digital mirror 300, for detecting an object 200 situated outside a default view α of a display 320-1, 320-2 of the digital mirror 300 and visualising the detected object 200 for a driver of the vehicle 100.
The system 500 may perform at least some of the previously described steps 401-405 according to the method 400 described above and illustrated in Figure 4.
The system 500 comprises at least one control unit 310 in the vehicle 100. The control unit 310 is configured to detect the object 200, situated outside a default view of a display 320-1, 320-2, intended to display objects 200 outside a driver's direct field of vision, based on signals received from a sensor 120-1, 120-2. Further, the control unit 310 is configured to generate control signals to adjust the field of view of a subset 340 of the display 320-1, 320-2 by widening the viewing angle so that the object 200 becomes visible for the driver in the subset 340 of the display 320-1, 320-2, while the main portion 330 of the display 320-1, 320-2 outputs the default view α at a default viewing angle.
Further, the control unit 310 may be configured to estimate that the object 200 is visible for the driver, either in the main portion 330 of the display 320-1, 320-2, or by direct vision. In some embodiments, the control unit 310 may furthermore be configured to generate control signals to adjust the field of view of the subset 340 of the display 320-1, 320-2 to the default view α at the default viewing angle.
The control unit 310 may in addition be further configured to determine the relative speed of the object 200 in relation to the vehicle 100. The control unit 310 may also be further configured to generate the control signals to adjust the field of view of the subset 340 of the display 320-1, 320-2 at a pace corresponding to the determined relative speed.
Further, in some embodiments, the control unit 310 may be configured to widen the field of view of the subset 340 of the display 320-1, 320-2 with a uniformly distributed viewing angle over the subset 340.

In some other alternative embodiments, the control unit 310 may be configured to widen the field of view of the subset 340 of the display 320-1, 320-2 with an increasing viewing angle towards the outer edge of the subset 340 of the display 320-1, 320-2, situated most remotely from the main portion 330 of the display 320-1, 320-2.
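The two widening strategies may be contrasted by how the columns of the subset 340 map to viewing angle: a uniform distribution spreads the extra angle evenly over the subset, while the alternative concentrates the widening towards the outer edge. The quadratic ramp used below for the progressive case is only one illustrative choice, not a disclosed mapping.

```python
def column_angles(n_cols, widened_deg, progressive=False):
    """Cumulative angle covered at each column boundary of the subset.

    Uniform: each column spans the same angular slice.
    Progressive: columns nearer the outer edge span increasingly large
    slices (here via a quadratic ramp), keeping the inner columns close
    to the default perspective and pushing the distortion outwards.
    """
    angles = []
    for i in range(1, n_cols + 1):
        t = i / n_cols                      # 0..1 across the subset
        frac = t * t if progressive else t  # quadratic ramp vs linear
        angles.append(widened_deg * frac)
    return angles

print(column_angles(4, 80.0))                    # -> [20.0, 40.0, 60.0, 80.0]
print(column_angles(4, 80.0, progressive=True))  # -> [5.0, 20.0, 45.0, 80.0]
```

In the progressive variant the slice widths grow from 5 to 35 degrees across the four columns, so the portion of the subset adjacent to the main portion 330 stays nearly undistorted.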
Further, the control unit 310 may be configured to determine the location of the detected object 200 in relation to the vehicle 100, based on received sensor signals. In addition, the control unit 310 may be configured to generate control signals for outputting information indicating the determined location of the detected object 200 for the driver, in some embodiments.
The control unit 310 comprises a receiving circuit 510 configured for receiving a signal from the cameras 110-1, 110-2 and the sensors 120-1, 120-2.
Further, the control unit 310 comprises a processor 520 configured for performing at least some steps of the method 400, according to some embodiments.
Such a processor 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processor" may thus represent processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
Furthermore, the control unit 310 may comprise a memory 525 in some embodiments. The optional memory 525 may comprise a physical device utilised to store data or programs, i.e. sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 525 may comprise integrated circuits comprising silicon-based transistors. The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data, such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc., in different embodiments.
Further, the control unit 310 may comprise a signal transmitter 530 in some embodiments. The signal transmitter 530 may be configured for transmitting a signal to e.g. the display 320-1, 320-2, and/ or a warning system or warning device, for example.

In addition, the system 500 also comprises at least one sensor 120-1, 120-2 of the vehicle 100, for detecting the object 200 situated outside a default view (a) of a display 320-1, 320-2, intended to display objects outside a driver's direct field of vision. The at least one sensor 120-1, 120-2 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, radar, lidar, an ultrasonic sensor, a time-of-flight camera, or a thermal camera or similar.
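The sensor-based detection can be illustrated with a small geometric sketch. The coordinate convention, function name and camera heading are assumptions for illustration, not taken from the disclosure: given an object position in the vehicle frame, the object is outside the default view when its bearing falls outside the angular span of the rear-facing mirror camera.

```python
import math

def outside_default_view(obj_x, obj_y, default_half_angle_deg, cam_heading_deg=180.0):
    """Return True if an object (vehicle-frame coordinates, metres, x forward,
    y left) lies outside the default view of a rear-facing mirror camera.

    The camera looks rearwards (cam_heading_deg = 180 by assumption); the
    default view spans +/- default_half_angle_deg around that heading.
    """
    bearing = math.degrees(math.atan2(obj_y, obj_x))        # 0 deg = straight ahead
    # smallest signed angle between bearing and camera heading, folded to [-180, 180)
    off_axis = abs((bearing - cam_heading_deg + 180.0) % 360.0 - 180.0)
    return off_axis > default_half_angle_deg
```

An object straight behind the vehicle is inside the default view, while one directly alongside is outside it and would trigger the widening of the subset of the display.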
The at least one sensor 120-1, 120-2 utilised for performing at least a part of the method 400 may in some embodiments have another main purpose than performing the method 400, i.e. it may already exist in the vehicle 100.
Further, the system 500 also comprises at least one digital mirror 300, comprising a camera 110-1, 110-2 for capturing a stream of images and a display 320-1, 320-2 for displaying the captured stream of images of the corresponding camera 110-1, 110-2. The display 320-1, 320-2 comprises a main portion 330, which always outputs a default view (a) at a default viewing angle, and a subset 340 of the display 320-1, 320-2, configured to alternately output the default view (a) at the default viewing angle and a widened viewing angle.
The above described steps 401-405 to be performed in the vehicle 100 may be implemented through the one or more processors 520 within the control unit 310, together with a computer program product for performing at least some of the functions of the steps 401-405. Thus a computer program product, comprising instructions for performing the steps 401-405 in the control unit 310, may perform the method 400, comprising at least some of the steps 401-405 for detecting the object 200 and visualising the detected object 200 for a driver of the vehicle 100, when the computer program is loaded into the one or more processors 520 of the control unit 310.
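One iteration of such a program can be summarised as a control loop over the steps 401-405. This is a sketch under assumed inputs: the helper parameters (gain, time step, maximum widening) and the proportional pace model are illustrative choices, not part of the claims.

```python
def update_subset_view(object_detected, object_visible_in_main_or_direct,
                       current_widening, relative_speed,
                       max_widening=20.0, gain=0.5, dt=0.1):
    """One iteration over the method's steps 401-405.

    Widens the subset's viewing angle (degrees) while an object is detected
    outside the default view (steps 401, 403), and restores the default view
    once the object is visible in the main portion or by direct vision
    (steps 404, 405). The pace of the adjustment follows the determined
    relative speed (step 402).
    """
    pace = gain * abs(relative_speed) * dt    # degrees of widening per iteration
    if object_detected and not object_visible_in_main_or_direct:
        current_widening = min(max_widening, current_widening + pace)
    elif object_visible_in_main_or_direct:
        current_widening = max(0.0, current_widening - pace)
    return current_widening
```

Tying the pace to the relative speed means a fast-approaching object widens the view quickly, while a slowly overtaking one produces a gradual, less distracting transition.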
Further, some embodiments of the invention may comprise a vehicle 100, comprising thecontrol unit 310, for detecting and visualising the object 200, according to at least some ofthe steps 401-405.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-405 according to some embodiments when being loaded into the one or more processors 520 of the control unit 310. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control unit 310 remotely, e.g., over an Internet or an intranet connection.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 400; the control unit 310; the computer program; the system 500 and/ or the vehicle 100. Various changes, substitutions and/ or alterations may be made, without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/ or" comprises any and all combinations of one or more of the associated listed items. The term "or" as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/ or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and/ or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and/ or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/ distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms such as via Internet or other wired or wireless communication systems.
Claims:
Claims (10)
[1] 1. A method (400) in a vehicle (100) comprising a digital mirror (300), for detecting an object (200) situated outside a default view (a) of a display (320-1, 320-2) of the digital mirror (300) and visualising the detected object (200) for a driver of the vehicle (100), wherein the method (400) comprises: detecting (401) the object (200), situated outside the default view (a) of the display (320-1, 320-2), by a sensor (120-1, 120-2); and adjusting (403) the field of view of a subset (340) of the display (320-1, 320-2) by widening the viewing angle so that the object (200) becomes visible for the driver in the subset (340) of the display (320-1, 320-2), while the main portion (330) of the display (320-1, 320-2) outputs the default view (a) at a default viewing angle.
[2] 2. The method (400) according to claim 1, further comprising: estimating (404) that the object (200) is visible for the driver, either in the main portion (330) of the display (320-1, 320-2), or by direct vision; and adjusting (405) the field of view of the subset (340) of the display (320-1, 320-2) to the default view (a) at the default viewing angle.
[3] 3. The method (400) according to any of claim 1 or claim 2, further comprising: determining (402) the relative speed of the object (200) in relation to the vehicle (100); and wherein the adjustment (403, 405) of the field of view of the subset (340) of the display (320-1, 320-2) is made at a pace corresponding to the determined (402) relative speed.
[4] 4. A control unit (310) in a vehicle (100) comprising a digital mirror (300), for detecting an object (200) situated outside a default view (a) of a display (320-1, 320-2) of the digital mirror (300) and visualising the detected object (200) for a driver of the vehicle (100), wherein the control unit (310) is configured to: detect the object (200), situated outside a default view of the display (320-1, 320-2), based on signals received from a sensor (120-1, 120-2); and generate control signals to adjust the field of view of a subset (340) of the display (320-1, 320-2) by widening the viewing angle so that the object (200) becomes visible for the driver in the subset (340) of the display (320-1, 320-2), while the main portion (330) of the display (320-1, 320-2) outputs the default view (a) at a default viewing angle.
[5] 5. The control unit (310) according to claim 4, further configured to: estimate that the object (200) is visible for the driver, either in the main portion (330) of the display (320-1, 320-2), or by direct vision; and generate control signals to adjust the field of view of the subset (340) of the display (320-1, 320-2) to the default view (a) at the default viewing angle.
[6] 6. The control unit (310) according to any of claim 4 or claim 5, further configured to:determine the relative speed of the object (200) in relation to the vehicle (100); andgenerate the control signals to adjust the field of view of the subset (340) of the display (320-1, 320-2) at a pace corresponding to the determined relative speed.
[7] 7. The control unit (310) according to any of claims 4-6, wherein the field of view of thesubset (340) of the display (320-1, 320-2) is widened with a uniformly distributed viewingangle over the subset (340).
[8] 8. The control unit (310) according to any of claims 4-6, wherein the field of view of the subset (340) of the display (320-1, 320-2) is widened with an increasing viewing angle towards the outer edge of the subset (340) of the display (320-1, 320-2), situated most remotely from the main portion (330) of the display (320-1, 320-2).
[9] 9. A computer program comprising program code for performing a method (400) according to any of claims 1-3 when the computer program is executed in a processor in a control unit (310), according to any of claims 4-8.
[10] 10. A system (500) in a vehicle (100) for detecting an object (200) situated outside a default view (a) of a display (320-1, 320-2) of the digital mirror (300) and visualising the detected object (200) for a driver of the vehicle (100), which system (500) comprises: a control unit (310) according to claims 4-8; a digital mirror (300), comprising a camera (110-1, 110-2) for capturing a stream of images and a display (320-1, 320-2) for displaying the captured stream of images, wherein the display (320-1, 320-2) comprises a main portion (330) which always outputs a default view (a) at a default viewing angle, and a subset of the display (320-1, 320-2), configured to alternately output the default view (a) at a default viewing angle and a widened viewing angle; and at least one sensor (120-1, 120-2) of the vehicle (100), for detecting the object (200) situated outside the default view (a) of the display (320-1, 320-2).
Similar technologies:
Publication number | Publication date | Patent title
US9771083B2|2017-09-26|Cognitive displays
US10009580B2|2018-06-26|Method for supplementing a piece of object information assigned to an object and method for selecting objects in surroundings of a vehicle
US20160297362A1|2016-10-13|Vehicle exterior side-camera systems and methods
US8781731B2|2014-07-15|Adjusting method and system of intelligent vehicle imaging device
JP4872245B2|2012-02-08|Pedestrian recognition device
US20120093358A1|2012-04-19|Control of rear-view and side-view mirrors and camera-coordinated displays via eye gaze
CN107438538B|2021-05-04|Method for displaying the vehicle surroundings of a vehicle
JP6511015B2|2019-05-08|Vehicle monitoring system
JP6756228B2|2020-09-16|In-vehicle display control device
US20150124097A1|2015-05-07|Optical reproduction and detection system in a vehicle
JP6453929B2|2019-01-16|Vehicle display system and method for controlling vehicle display system
EP2626246A1|2013-08-14|Vehicular infrared night assistant driving system
JP2017056909A|2017-03-23|Vehicular image display device
US11050949B2|2021-06-29|Method and control unit for a digital rear view mirror
EP3414131B1|2020-09-09|System for reducing a blind spot for a vehicle
KR20180129044A|2018-12-05|Driver assistance apparatus in vehicle and method for guidance a safety driving thereof
US9283891B1|2016-03-15|Alert systems and methods using a transparent display
KR102100978B1|2020-04-14|Control unit and method for rear view
WO2018159017A1|2018-09-07|Vehicle display control device, vehicle display system, vehicle display control method and program
JP2008162550A|2008-07-17|External environment display device
CN110435539B|2021-01-05|Image display method of head-up display and head-up display system
JP5787168B2|2015-09-30|Obstacle alarm device
JP2022007157A|2022-01-13|Vehicle control device
US20190361533A1|2019-11-28|Automated Activation of a Vision Support System
JP6956473B2|2021-11-02|Sideways state judgment device
Patent family:
Publication number | Publication date
US11050949B2|2021-06-29|
WO2018004421A1|2018-01-04|
KR102130059B1|2020-07-06|
EP3475124A4|2020-02-26|
KR20190018708A|2019-02-25|
BR112018076187A2|2019-03-26|
EP3475124A1|2019-05-01|
US20190191107A1|2019-06-20|
CN109415018A|2019-03-01|
SE539981C2|2018-02-20|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title

US5289321A|1993-02-12|1994-02-22|Secor James O|Consolidated rear view camera and display system for motor vehicle|
US6193380B1|1999-04-20|2001-02-27|Raymond A. Jacobs|Vehicle blind spot mirror|
WO2001064481A2|2000-03-02|2001-09-07|Donnelly Corporation|Video mirror systems incorporating an accessory module|
US6885968B2|2000-05-08|2005-04-26|Automotive Technologies International, Inc.|Vehicular exterior identification and monitoring system-agricultural product distribution|
AU5964001A|2000-05-08|2001-11-20|Automotive Tech Int|Vehicular blind spot identification and monitoring system|
US7354166B2|2003-11-25|2008-04-08|Temic Automotive Of North America, Inc.|Automatic viewing of vehicle blind spot|
US7477137B2|2005-06-23|2009-01-13|Mazda Motor Corporation|Blind-spot detection system for vehicle|
JP4955471B2|2007-07-02|2012-06-20|株式会社デンソー|Image display device and in-vehicle image display device|
US8694195B2|2007-12-04|2014-04-08|Volkswagen Ag|Motor vehicle having a wheel-view camera and method for controlling a wheel-view camera system|
US20100020170A1|2008-07-24|2010-01-28|Higgins-Luthman Michael J|Vehicle Imaging System|
US20100076683A1|2008-09-25|2010-03-25|Tech-Cast Mfg Corp.|Car and ship bling spot-free collision avoidance system|
US8207835B2|2008-10-19|2012-06-26|Micha Schwartz|Vehicle safety device|
EP2473871B1|2009-09-01|2015-03-11|Magna Mirrors Of America, Inc.|Imaging and display system for vehicle|
US20110115913A1|2009-11-17|2011-05-19|Werner Lang|Automated vehicle surrounding area monitor and display system|
JP5619873B2|2010-03-26|2014-11-05|本田技研工業株式会社|Device for supporting driving of a vehicle|
JP2011205513A|2010-03-26|2011-10-13|Aisin Seiki Co Ltd|Vehicle periphery monitoring device|
US20120062741A1|2010-09-03|2012-03-15|Cvg Management Corporation|Vehicle camera system|
DE102011010624B4|2011-02-08|2014-10-16|Mekra Lang Gmbh & Co. Kg|Display device for fields of view of a commercial vehicle|
KR101438892B1|2012-07-03|2014-09-05|현대자동차주식회사|Apparatus and method for displaying dead zone of vehicle|
FR2994551B1|2012-08-14|2015-12-18|Jean Claude Galland|METHODS AND DEVICES FOR INDIRECT VISION ENLARGEMENT OF THE DIRECT VISION FIELD OF THE DRIVER OF A MOTOR VEHICLE|
US9139135B2|2012-09-07|2015-09-22|Musaid A. ASSAF|System and method that minimizes hazards of blind spots while driving|
US9096175B1|2012-10-04|2015-08-04|Ervin Harris|Split screen rear view display|
US20140114534A1|2012-10-19|2014-04-24|GM Global Technology Operations LLC|Dynamic rearview mirror display features|
KR101481229B1|2012-10-23|2015-01-09|현대자동차주식회사|Method and system for adjusting side-mirror|
EP3013645B1|2013-06-26|2017-07-12|Conti Temic microelectronic GmbH|Mirror replacement device and vehicle|
US20140327775A1|2013-12-04|2014-11-06|Kyu Hwang Cho|Mirrorless Driving of Automotive Vehicle Using Digital Image Sensors and Touchscreen|
CN103935298B|2014-04-02|2016-02-17|华东交通大学|A kind of automobile rear view mirror without dead zone|
DE102014008687A1|2014-06-12|2015-12-17|GM Global Technology Operations LLC |Method for displaying vehicle surroundings information of a motor vehicle|
US10127463B2|2014-11-21|2018-11-13|Magna Electronics Inc.|Vehicle vision system with multiple cameras|
WO2016093535A1|2014-12-10|2016-06-16|엘지전자 주식회사|Vehicle display device and vehicle comprising same|
SE539443C2|2016-02-10|2017-09-26|Scania Cv Ab|System for reducing a blind spot for a vehicle|
SE541846C2|2016-02-10|2019-12-27|Scania Cv Ab|Method and control unit for rear view|
EP3487172A4|2016-07-13|2019-07-10|Sony Corporation|Image generation device, image generation method, and program|
EP3499878A4|2016-08-08|2021-01-20|Vehicle monitoring system employing plurality of cameras|
CN108501806A|2018-02-02|斑马网络技术有限公司|Automobile with reduced rearview-mirror blind spot, and method for reducing the blind spot thereof|
CN112498243A|2020-11-12|2021-03-16|浙江合众新能源汽车有限公司|Self-adaptive adjusting method and device for exterior rearview mirror|
Legal status:
Priority:
Application number | Filing date | Patent title
SE1650924A|2016-06-28|Method and control unit for a digital mirror|
SE1650924A| SE539981C2|2016-06-28|2016-06-28|Method and control unit for a digital mirror|
US16/311,227| US11050949B2|2016-06-28|2017-06-12|Method and control unit for a digital rear view mirror|
BR112018076187-7A| BR112018076187A2|2016-06-28|2017-06-12|Method and control unit for a digital rearview mirror|
CN201780039997.3A| CN109415018A|2016-06-28|2017-06-12|Method and control unit for digital rearview mirror|
PCT/SE2017/050626| WO2018004421A1|2016-06-28|2017-06-12|Method and control unit for a digital rear view mirror|
EP17820640.5A| EP3475124A4|2016-06-28|2017-06-12|Method and control unit for a digital rear view mirror|
KR1020197001412A| KR102130059B1|2016-06-28|2017-06-12|Digital rearview mirror control unit and method|